Study Finds YouTube's System Sends Gun Videos to 9-year-olds

2023-05-21

A new study has found that YouTube's tool to suggest videos can direct young users to content about guns and violence.
The study was based on an experiment carried out by the Tech Transparency Project. The nonprofit group studies social media services. Researchers from the group set up two YouTube accounts that simulated online activity that might be interesting to 9-year-old boys.
The two accounts contained the exact same information. The only difference was that one account chose only to watch videos suggested by YouTube. The other ignored the video service's suggested offerings.
The organization found the account that chose to watch YouTube's suggestions was flooded with graphic videos. These included videos about school shootings and instructions for making guns fully automatic.
Many of the suggested videos violate YouTube's own policies against violent or graphic content.
YouTube has technology tools that are meant to restrict some kinds of videos. But the study suggests that those tools are failing to block violent content from young users. The researchers involved in the study said the tools may even be sending children to videos that include extremist and violent material.
Katie Paul leads the Tech Transparency Project. She said, "Video games are one of the most popular activities for kids. You can play a game like 'Call of Duty' without ending up at a gun shop - but YouTube is taking them there."
Paul added, "It's not the video games, it's not the kids. It's the algorithms." An algorithm is a set of steps that are followed to complete a computing process or problem.
Social media companies use algorithms to predict what content users might be interested in based on past watch history. Algorithm tools suggest that content to users.
The accounts that clicked on YouTube's suggested videos received 382 different gun-related videos in a single month. The accounts that ignored YouTube's suggestions still received some gun-related videos, but only 34 in total.
A spokeswoman for YouTube defended the platform's protections for children and noted that it requires users under age 17 to get a parent's permission before using its website.
YouTube says accounts for users younger than 13 are linked to a parental account. The company noted that it offers several choices for younger viewers that are "designed to create a safer experience for tweens and teens."
Children's activist groups have long criticized YouTube for making violent and troubling content easily available to young users. They say YouTube sometimes suggests videos that promote gun violence, eating disorders and self-harm.
In some cases, YouTube has already removed some of the videos that the Tech Transparency Project identified. But others remain available.
Many technology companies depend on computer programs to identify and remove content that violates their rules. But Paul said findings from her organization's study show that greater investments and efforts are needed to block such material.
Justin Wagner is the director of investigations at Everytown for Gun Safety, a gun control activist group. He told the AP that without federal legislation, social media companies must do more to enforce their own rules.
He added, "Children who aren't old enough to buy a gun shouldn't be able to turn to YouTube to learn how to build a firearm, modify it to make it deadlier, or commit atrocities."
I'm Bryan Lynn.
Bryan Lynn wrote this story for VOA Learning English, based on reports from The Associated Press.
_______________________________________________________________
Words in This Story
simulate - v. to do or make something that behaves or looks like something real but is not
graphic - adj. extremely clear and detailed
promote - v. to urge people to like, buy or use something
modify - v. to change something in order to improve it
atrocity - n. an extremely violent and shocking attack